11 research outputs found
Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)
The implicit objective of the biennial "international Traveling Workshop on
Interactions between Sparse models and Technology" (iTWIST) is to foster
collaboration between international scientific teams by disseminating ideas
through both specific oral/poster presentations and free discussions. For its
second edition, the iTWIST workshop took place in the medieval and picturesque
town of Namur in Belgium, from Wednesday August 27th till Friday August 29th,
2014. The workshop was conveniently located in "The Arsenal" building within
walking distance of both hotels and town center. iTWIST'14 gathered about
70 international participants and featured 9 invited talks, 10 oral
presentations, and 14 posters on the following themes, all related to the
theory, application, and generalization of the "sparsity paradigm":
Sparsity-driven data sensing and processing; Union of low dimensional
subspaces; Beyond linear and convex inverse problem; Matrix/manifold/graph
sensing/processing; Blind inverse problems and dictionary learning; Sparsity
and computational neuroscience; Information theory, geometry and randomness;
Complexity/accuracy tradeoffs in numerical methods; Sparsity? What's next?;
Sparse machine learning and inference.
Comment: 69 pages, 24 extended abstracts, iTWIST'14 website:
http://sites.google.com/site/itwist1
The magmatism of the Kwyjibo region, Grenville Province (Canada): significance for the associated iron-oxide-type mineralization
The granitic plutons located north of the Kwyjibo property in Quebec's Grenville Province are of Mesoproterozoic age and belong to the granitic Canatiche Complex. The rocks in these plutons are calc-alkalic, K-rich, and meta- to peraluminous. They belong to the magnetite series, and their trace-element characteristics link them to intraplate granites. They were emplaced in an anorogenic, subvolcanic environment, but they subsequently underwent significant ductile deformation. The magnetite, copper, and fluorite showings on the Kwyjibo property are polyphased and premetamorphic; their formation began with the emplacement of hydraulic, magnetite-bearing breccias, followed by impregnations and veins of chalcopyrite, pyrite, and fluorite, and ended with a late phase of mineralization, during which uraninite, rare earths, and hematite were emplaced along brittle structures. The plutons belong to two families: biotite-amphibole granites and leucogranites. The biotite-amphibole granites are rich in iron and represent a potential heat and metal source for the first, iron-oxide phase of mineralization. The leucogranites show a primary enrichment in REE (rare-earth elements), F, and U, carried mainly in Y-, U-, and REE-bearing niobotitanates. They are metamict and underwent a postmagmatic alteration that remobilized the uranium and the rare earths. The leucogranites could also be a source of rare earths and uranium for the latest mineralizing events.
Convolutional Transform learning
This work proposes a new representation learning technique called convolutional transform learning. In standard transform learning, a dense basis is learned that analyses the image to generate its representation. Here, we instead learn a set of independent convolutional filters that operate on the images to produce representations (one corresponding to each filter). The major advantage of the proposed approach is that it is completely unsupervised, unlike CNNs, which require labeled images for training. Moreover, it relies on a well-founded minimization technique with established convergence guarantees. We compared the proposed method with dictionary learning and transform learning on standard image classification datasets; results show that our method improves over both by a considerable margin.
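The alternating scheme the abstract alludes to can be sketched in a few lines for 1D signals: a closed-form soft-threshold update for the sparse representations, alternated with an update of the filters. Everything here is illustrative, not the authors' implementation: the filter step is a plain gradient step, and unit-norm filters stand in for the regularizer that rules out the trivial zero solution; `lam`, `lr`, and the problem sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(v, lam):
    # proximal operator of the l1 sparsity penalty
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def conv_all(X, t):
    # valid-mode convolution of each 1D signal in X with filter t
    return np.array([np.convolve(x, t, mode="valid") for x in X])

def ctl_fit(X, num_filters=3, filter_len=5, lam=0.1, lr=1e-3, iters=100):
    # Alternating minimization sketch of
    #   sum_m ||t_m * X - Z_m||_F^2 + lam * ||Z_m||_1
    # Z-step: exact proximal (soft-threshold) update.
    # t-step: gradient step, then renormalization to unit norm.
    T = rng.standard_normal((num_filters, filter_len))
    T /= np.linalg.norm(T, axis=1, keepdims=True)
    Z = None
    for _ in range(iters):
        Z = [soft_threshold(conv_all(X, t), lam) for t in T]
        for m in range(num_filters):
            R = conv_all(X, T[m]) - Z[m]  # residuals per signal
            grad = np.zeros(filter_len)
            for x, r in zip(X, R):
                # gradient of ||x * t - z||^2 w.r.t. t (up to a factor of 2)
                grad += np.correlate(x, r, mode="valid")[::-1]
            T[m] -= lr * grad
            T[m] /= max(np.linalg.norm(T[m]), 1e-12)  # keep unit norm
    return T, Z

# toy usage: 8 random signals of length 32
X = rng.standard_normal((8, 32))
T, Z = ctl_fit(X)
print(T.shape, len(Z), Z[0].shape)  # (3, 5) 3 (8, 28)
```

Each representation is shorter than the signal (valid-mode convolution), and the soft threshold is what makes the learned representations sparse.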
Deep Convolutional Transform Learning
This work introduces a new unsupervised representation learning technique called Deep Convolutional Transform Learning (DCTL). By stacking convolutional transforms, our approach is able to learn a set of independent kernels at different layers. The features extracted in this unsupervised manner can then be used to perform machine learning tasks such as classification and clustering. The learning technique relies on a well-founded alternating proximal minimization scheme with established convergence guarantees. Our experimental results show that the proposed DCTL technique outperforms its shallow counterpart, CTL, on several benchmark datasets.
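The stacking idea can be illustrated by the forward pass alone: each layer convolves every current channel with every filter of its bank and applies a sparsifying soft threshold before the next layer. This is a sketch of the architecture only, not the authors' training procedure; the filter banks here are random placeholders standing in for learned ones, and `lam` is an arbitrary threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(v, lam):
    # sparsifying proximal step applied between layers
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def dctl_features(X, layers, lam=0.1):
    # Forward pass through stacked convolutional transforms:
    # every channel is convolved (valid mode) with every filter
    # of the next layer's bank, then soft-thresholded.
    channels = [np.asarray(X, dtype=float)]
    for T in layers:
        channels = [
            soft_threshold(
                np.array([np.convolve(x, t, mode="valid") for x in ch]), lam
            )
            for ch in channels
            for t in T
        ]
    # mean-pool each channel: one feature per channel per signal
    return np.stack([ch.mean(axis=1) for ch in channels], axis=1)

# toy usage: two layers of random (placeholder) filter banks
X = rng.standard_normal((4, 64))
layers = [rng.standard_normal((2, 7)), rng.standard_normal((3, 5))]
F = dctl_features(X, layers)
print(F.shape)  # (4, 6): 4 signals, 2 * 3 = 6 channels
```

The pooled features `F` are what a downstream classifier or clustering algorithm would consume, matching the unsupervised feature-extraction use described in the abstract.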